Initializing Entity Representations in Relational Models

Authors

  • Teng Long
  • Ryan Lowe
  • Jackie Cheung
  • Doina Precup
Abstract

Recent work in learning vector-space embeddings for multi-relational data has focused on combining relational information derived from knowledge bases with distributional information derived from large text corpora. We propose a simple trick that leverages the descriptions of entities or phrases available in lexical resources, in conjunction with distributional semantics, in order to derive a better initialization for training relational models. Applying this trick to the TransE model results in faster convergence of the entity representations, and achieves small improvements on Freebase for raw mean rank. More surprisingly, it results in significant new state-of-the-art performances on the WordNet dataset, decreasing the mean rank from the previous best of 212 to 51. We find that there is a trade-off between improving the mean rank and the hits@10 with this approach. This illustrates that much remains to be understood regarding performance improvements in relational models.
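The initialization trick described in the abstract lends itself to a short illustration. Below is a minimal sketch in Python/NumPy of one way entity vectors could be derived from entity descriptions before TransE training; the toy vocabulary, the `init_entity_embedding` helper, the averaging-plus-unit-norm scheme, and the `related_to` relation are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: description-based initialization of TransE entity embeddings.
# Assumptions (not taken from the paper): word vectors are a plain dict of
# NumPy arrays, descriptions are whitespace-tokenized strings, and entities
# with no usable description fall back to a random initialization.
import numpy as np

DIM = 50
rng = np.random.default_rng(0)

# Toy stand-ins for pre-trained distributional word vectors (e.g. word2vec);
# in practice these would be loaded from embeddings trained on a large corpus.
word_vectors = {w: rng.normal(scale=0.1, size=DIM)
                for w in "a small domesticated feline canine mammal".split()}

# Toy stand-ins for lexical-resource descriptions (e.g. WordNet glosses).
descriptions = {
    "cat_n_1": "a small domesticated feline mammal",
    "dog_n_1": "a small domesticated canine mammal",
}

def init_entity_embedding(description, word_vectors, dim=DIM):
    """Average the pre-trained vectors of the description tokens.

    Tokens without a vector are skipped; if none remain, fall back to the
    uniform random initialization a plain TransE model would use.
    """
    vecs = [word_vectors[t] for t in description.lower().split() if t in word_vectors]
    if not vecs:
        bound = 6.0 / np.sqrt(dim)
        return rng.uniform(-bound, bound, size=dim)
    v = np.mean(vecs, axis=0)
    return v / np.linalg.norm(v)  # TransE constrains entity vectors to unit norm

entity_emb = {e: init_entity_embedding(d, word_vectors) for e, d in descriptions.items()}

def transe_score(h, r, t):
    """TransE plausibility: lower ||h + r - t||_1 means a more plausible triple."""
    return np.linalg.norm(h + r - t, ord=1)

# Relation embeddings and the margin-based SGD training loop stay exactly as in
# standard TransE; only the entity initialization above changes.
relation_emb = {"related_to": rng.normal(scale=0.1, size=DIM)}
print(transe_score(entity_emb["cat_n_1"], relation_emb["related_to"], entity_emb["dog_n_1"]))
```

Starting from description-derived vectors places semantically related entities close together before training begins, which is consistent with the faster convergence reported in the abstract.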


Similar Articles

Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors

Knowledge bases provide applications with the benefit of easily accessible, systematic relational knowledge but often suffer in practice from their incompleteness and lack of knowledge of new entities and relations. Much work has focused on building or extending them by finding patterns in large unannotated text corpora. In contrast, here we mainly aim to complete a knowledge base by predicting...

RelTextRank: An Open Source Framework for Building Relational Syntactic-Semantic Text Pair Representations

We present a highly flexible UIMA-based pipeline for developing structural kernel-based systems for relational learning from text, i.e., for generating training and test data for ranking, classifying short text pairs or measuring similarity between pieces of text. For example, the proposed pipeline can represent an input question and answer sentence pairs as syntactic-semantic structures, enrichi...

Learning Multi-Relational Semantics Using Neural-Embedding Models

Real-world entities (e.g., people and places) are often connected via relations, forming multi-relational data. Modeling multi-relational data is important in many research areas, from natural language processing to biological data mining [6]. Prior work on multi-relational learning can be categorized into three categories: (1) statistical relational learning (SRL) [10], such as Markov logic netw...

kLog: A Language for Logical and Relational Learning with Kernels (Extended Abstract)

We introduce a novel approach to statistical relational learning; it is incorporated in the logical and relational learning language, kLog. While traditionally statistical relational learning combines probabilistic (graphical) models with logical and relational representations, kLog combines a kernel-based approach with expressive logical and relational representations. kLog allows users to spec...

On the Design and Maintenance of Optimized Relational Representations of Entity-Relationship Schemas

A method for obtaining optimized relational representations of database conceptual schemas in an extended entity-relationship model is first proposed. The method incorporates and generalizes a familiar heuristic to obtain good relational representations and also produces, for each relational structure, an explanation indicating which concepts it represents. Then, a redesign method that, given ...


Journal:

Volume   Issue

Pages  -

Publication date: 2016